
Alternative Answers to Questions from the Google Live Chat Session

Last week, Google held a live chat session with a number of terrific engineers from the spam, search quality & webmaster central teams. Barry Schwartz posted a text transcript of the chat on SERoundTable, which I read through hoping to find some interesting nuggets of information to pass on. Unfortunately, the space given to the Googlers for responses was only 3-4 lines (due to the WebEX client used), so there wasn’t much opportunity to provide detail.

I thought it would be more valuable to provide the answers from Google alongside the answers I would have given. Hopefully, in this fashion, folks can compare side-by-side.


Andrea Moro – 5:08 pm
Q: What about a feedback status on Spam Report? I mean when I report spam site, I immediately get a message that the suggestion will be taked [sic] on mind, but nobody let us know when, or if the reported site or submission are right or not.

Matt Cutts – 5:15 pm
A: Andrea, normally we’re able to take a look at the reports pretty quickly. I like the idea of giving a little more feedback though.

Rand: Google can take anywhere from a day to two years to take action on spam reports. Generally speaking, unless the violation is egregious (or appears publicly in the media), Google likes to find scalable, algorithmic solutions to spam issues. Thus, they’ll take your report, compile it with dozens of similar reports of the same types of violations, and work from an engineering perspective to come up with a solution that catches everyone using the tactic, not just the single site/page you reported. We’ve filed spam reports with Google on behalf of clients on numerous occasions, and it’s very rare that any fast, direct action is taken. In several cases, reports filed a year or more ago for cloaking, keyword stuffing, and link manipulation still haven’t produced any results.

My best advice, if you’re seeking to really get a competitor booted from the index or penalized in the SERPs immediately, is to write about them on major SEO-related forums or submit a thread at Sphinn or a blog post to YOUmoz. When spam is reported publicly, Google tends to take action much more quickly and directly.

BTW – For a much better answer to a very similar question, see Susan Moskwa’s response later on, which read:

A: We usually use them to improve our algorithms, so changes may be more long-term than immediate. But we definitely take these reports into consideration.
http://googlewebmastercentral.blogspot.com/2008/06/impact-of-user-feedback-part-1.html

——–

seth holladay – 5:14 pm
Q: how do you define and penalize duplicate content? are syndication deals excluded?

Mariya Moeva – 5:15 pm
A: Hi Seth, we just did a post on duplicate content on the Google Webmaster Central blog which has a lot of useful information that may be helpful for you

Rand: Sadly, syndication deals are not excluded, but I also wouldn’t necessarily say that duplicate content is always penalized. I believe the post Mariya is referring to is here – Duplicate Content Due to Scrapers. It’s a solid discussion of the topic, and notes that most of the time, you’re not going to encounter real “penalties” for copying content; you’ll just have those pages filtered out of the results.

However, in any syndication deal, you need to carefully manage expectations. If you are licensing the content out, you need to decide whether you still want the majority of search traffic to come to your site. If so, you’ll want to write rules into the contract requiring links back to your original version, and possibly even request the use of the meta robots directive “noindex, follow” so the engines don’t get confused by another version (see the sketch below). On the flip side, if you’re taking the content, it’s very wise to make sure you know how many other parties have licensed and posted that same content piece, whether you’re required to link back to the original source, and what rules exist on search engine indexing. Many times, new content properties or smaller content websites will see search traffic come in from licensed content and rush to acquire more without thinking through the consequences. I’d strongly suggest reading these four pieces on the subject.
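To make the contract points concrete, here’s a minimal sketch of what the licensed copy’s markup might include – the URL and anchor text are hypothetical, but the meta robots directive is exactly the one mentioned above:

  <head>
    <!-- Keeps the syndicated copy out of the index while still letting engines follow its links -->
    <meta name="robots" content="noindex, follow">
  </head>
  ...
  <!-- Attribution link back to the original version, as required by the licensing contract -->
  <p>This article originally appeared at <a href="http://www.example.com/original-article">Example.com</a>.</p>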

——–

brian vastola – 5:08 pm
Q: beside content, what are the top 3 things to do to your site to reank [sic] high ( short version)

Susan Moskwa – 5:22 pm
A: http://www.google.com/support/webmasters/bin/answer.py?answer=40349

Rand: I’m not sure how much the linked-to page on building a search-friendly site answers a question about the most important factors in the ranking algorithm. If you want the opinions of some very smart SEOs, I’d check out the search ranking factors document. The real answer here is that we don’t know for certain, and Google couldn’t freely share this information in a direct, transparent, accurate way because it’s part of their proprietary operations. However, if you want just my personal opinion, that’s here.

——–

Tim Dineen – 5:22 pm
Q: What can we do to get the geo-target country correct when ccTLD isn’t available and Webmaster Tools declaration (3-4 months ago) did nothing.

Matt Cutts – 5:26 pm
A: Tim Dineen, I think we offered the feature in the frontend and then started supported [sic] it in the backend a little later, but I believe that we handle the geotargeting in the webmaster console pretty quickly these days.

Rand: There are a lot of factors besides the Google Webmaster Tools declaration that can help put you in the right country for geo-targeting. I’d first think about using a domain name with the proper ccTLD. You mentioned that the right name wasn’t available – I’d consider some other alternatives before giving up. I’d also make sure to host the site (whatever the TLD) on an IP address in the country you’re trying to target, use the language of that country, get links from other domains in that country, and register with Google Local/Maps using a physical address in the country. Adding that physical address to the pages of the site and getting listings in local directories will also help. We’ve experienced the same problems with the Webmaster Tools country-specific targeting, and we find that although it suggests it will solve the issue, there are actually myriad factors Google considers before they’ll “take your word” from Webmaster Tools that you’re genuinely intended for a country-specific audience.

——–

Jonathan Faustman – 5:21 pm
Q: Will hiding navigation items with css (that are displayed on certain pages/directories) have a negative impact when google indexes the site?

Mariya Moeva – 5:26 pm
A: Hi Jonathan, when building your site and considering hiding navigation elements, it’s best to always think, “is this good for my users?” and “would i do this if there were no search engines?”

Rand: I’ve got strong opinions about the phrase “Would I do this if there were no search engines?” In fact, I believe it needs to be dropped from the engines’ lexicons. We wouldn’t register with Webmaster Tools, we wouldn’t noindex duplicate content, we wouldn’t use meta tags (and many times even title tags), we wouldn’t nofollow paid links, we wouldn’t create sitemaps, we wouldn’t build HTML alternatives to Flash and we wouldn’t worry about CSS issues or AJAX if it weren’t for search engines. Asking us if we’d do something if there were no engines is a completely useless way of thinking about SEO or website accessibility in the modern era.

That said, Jonathan, I’d say that so long as the number of hidden elements is very small relative to the amount of content on the page, and so long as you’re providing easy, intuitive ways for users to reach those navigational elements, you’ll probably be OK. SEOmoz itself fell under a penalty for keeping a large amount of content on a page in a display:none style, even though it was done in a perfectly legitimate, user-friendly way. Be cautious about how you hide content from users and what search engines might misinterpret – you can’t just build for one or the other if you want a successful SEO strategy. I’d have to look at your specific page to make a judgment call, but my general advice would be to walk on eggshells when it comes to hiding navigation with CSS, and to do it sparingly.
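As a rough sketch of the kind of pattern that’s generally safer – a small submenu hidden by default but revealed to users on hover, with only a handful of links relative to the rest of the page (the class names are hypothetical):

  <style type="text/css">
    .nav .submenu { display: none; }            /* hidden by default */
    .nav li:hover .submenu { display: block; }  /* revealed when users hover the parent item */
  </style>
  <ul class="nav">
    <li><a href="/products">Products</a>
      <ul class="submenu">
        <li><a href="/products/widgets">Widgets</a></li>
        <li><a href="/products/gadgets">Gadgets</a></li>
      </ul>
    </li>
  </ul>

Because users can reach the hidden links through an obvious interaction, this is a very different animal from stuffing invisible keyword blocks into display:none.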

——–

Peter Faber – 5:11 pm
Q: Question: Suppose you rank #1 for inanchor: intitle: and intext: But for just the keyword phrase you’re on second page. Any tips on what to do to get up for just the phrase as well?

Matt Dougherty – 5:29 pm
A: Hi Peter, echoing what John just talked about, I’d say making sure your content is useful to users is the best approach.

Rand: Matt’s answer really frustrates me, as it’s almost a non sequitur to the question. Peter – we see rankings like that quite a bit as well, and very frequently it has to do with how Google ranks pages in the normal results vs. operator-modified searches. Intuition, and seeing a lot of SERPs like this, tells me that some element of the trust and domain authority algorithms isn’t coming into play as strongly in the inanchor/intitle/intext results. Generally, when I see those sorts of results, it means you’re close to achieving your ranking goals but need to get more “trusted” links from high-quality domains into your site. We also see rankings like this when a “sandbox”-like effect is in place – it could be that one day you’ll see your domain “pop” out of the lower rankings and into top positions for many of these searches (what SEOs call “breaking out of the sandbox”). So, the good news is that you’re doing a lot of things right; the bad news is that you need either more trust juice (from high-quality links) or more time (to “break out”) before you’ll achieve those rankings in the normal SERPs.
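For anyone unfamiliar with the operators Peter mentions, the situation he describes looks like this (using a hypothetical phrase):

  inanchor:"blue widgets"   – ranks #1
  intitle:"blue widgets"    – ranks #1
  intext:"blue widgets"     – ranks #1
  blue widgets              – second page

The operator-modified queries restrict Google to matching anchor text, titles, or body text, so a gap between those results and the normal SERP suggests (as above) that query-independent signals like trust and domain authority are what’s holding the page back.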

——–

Wall-E The Robot – 5:30 pm
Q: I have some Amazon Associates webstores. Obviously they have the same content that Amazon has. And obviously Google sees duplicate content and dont indexes [sic] a lot of my webpages. Do you have any suggestions on how to solve this?

Mariya Moeva – 5:31 pm
A: Hi Wall-E, as long as your Amazon Associates store provides added value to users, there’s nothing you should be worried about

Rand: I’m worried that Mariya’s answer here is misleading. Wall-E has a lot to be worried about, even if the store adds value to users. First off, if the pages aren’t getting indexed, duplicate content might be one issue, but PageRank/link juice accumulation might be another. Google has a certain threshold it likes pages to reach in terms of PageRank before those pages earn the right to be in the main index. If Wall-E is earning lots of good, high-quality external links to his site, a good start would be looking at the internal link structure to ensure that juice flows well through the pages.

On the duplicate content issue – Amazon is always going to get the benefit of the doubt when it comes to who owns the content. The best solution here is not just to create value for users, but to avoid making large portions of copied content indexable – using iframes, or only minor snippets at a time, is important (see the sketch below). You probably also want to find automated ways to change some of the input fields you receive from Amazon; just copying the titles, prices, categories, tags, photos, etc. could get you into dangerous territory. Google requires that each page you produce meet a certain, secret threshold of unique, valuable content before it earns indexation – you’ll need to solve that issue in order to achieve consistent rankings.
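As one sketch of the iframe approach: serve Amazon’s duplicated description through a frame, and keep your own unique commentary in the indexable page itself (the URLs and parameter names here are hypothetical):

  <!-- Amazon's copied description lives in the framed document, not in this page's indexable content -->
  <iframe src="http://www.example-store.com/frames/description?asin=B000EXAMPLE" width="520" height="300"></iframe>

  <!-- Your own unique, valuable content stays in the page, where engines can index it -->
  <p>Our take: this model runs quieter than its predecessor, but battery life takes a hit...</p>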

——–

Robert Longfield – 5:16 pm
Q: Further on Geotargetting. I run a multinational site with about 12 different languages being supported. We are implimenting geotrageting [sic] so users are directed to the appropriate language page for their country. The concern of some is that Google may penalize me…

John Mueller – 5:35 pm
A: I would recommend not redirecting users based on their location. This can be a bad user experience. It’s better to allow a user to choose his version based on his searches.

Rand: Such brazen hypocrisy! Google can geotarget its search results, geotarget its homepage, and geotarget many of its other service pages, but heaven forbid anyone else do it. This is ridiculous. Robert – I’d simply do a quick check before you redirect your users. If their browser accepts cookies, feel free to drop one, redirect them to the appropriate page, and let your user data, feedback, and analytics tell you whether it’s the best experience. If the browser doesn’t accept cookies, drop them on an international landing page that lets them choose their country/language – this also works well for search engine bots (which don’t accept cookies), since they’ll be able to find all of your country-targeted content.
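A minimal client-side sketch of that check might look like the following – the cookie name and the German-language path are hypothetical, and the actual country detection would happen server-side (e.g., from the visitor’s IP):

  <script type="text/javascript">
    // Drop a test cookie, then read it back; if it's missing, cookies are disabled.
    document.cookie = "cookietest=1";
    var cookiesEnabled = document.cookie.indexOf("cookietest=") !== -1;

    if (cookiesEnabled) {
      // Cookie-accepting visitors get redirected to their country/language version.
      window.location.href = "/de/";
    }
    // Search engine bots and cookieless browsers fall through to this international
    // landing page, which links out to every country-targeted version of the site.
  </script>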

——–

David Thurman – 5:39 pm
Q: Does Google favor .html over say .php or do you treat all URL’s the same

Bergy Berghausen – 5:41 pm
A: A URL is a URL. As long as it’s serving content we can read, Googlebot doesn’t care what the file extension is.

Rand: I hate to be a stickler, but didn’t we just go through an episode over file extensions? Bergy really should point out that .exe, .tgz, and .tar (and .0, although that appears to be getting cleaned up) aren’t indexed by Google.

——–

Gordy Harrower – 5:40 pm
Q: Do Google tools help a site’s ranking?

Mariya Moeva – 5:43 pm
A: Hi Gordy! The most important thing to focus on is the quality of your site and the content that you provide for users

Rand: That’s a really obvious, unhelpful non-answer, and disappointing to see. I think the message being conveyed here is that Google can’t answer the question, which to my mind means the answer is “probably.” I’ve long suspected that if you ran a statistical check on all the sites that have ever registered with Webmaster Tools vs. those that haven’t, you’d find a much lower incidence of spam among the registered sites, and thus registration might be a metric used in judging trustworthiness, even if it doesn’t have a direct impact on rankings.

As an aside, anytime Google (or any of the engines) can’t give an answer, I think it makes them look so much better when they say something like, “That’s the kind of question we can’t answer directly, because it’s about our ranking system, which we need to keep private to help prevent spam.” I have so much more respect for that directness – for treating the webmaster/question-asker as an adult and a professional – than I do for the “make good sites!” malarkey. From a corporate communication standpoint, the latter instantly flips the goodwill switch Google has ingrained into most of the web-using populace from “you’re awesome” to “Oh man, seriously? I thought you guys were better than that.”

——–

None of this is to say there weren’t some really good answers from Googlers, too. In fact, I’d say it was about 1/3 terrific answers, 1/3 mediocre and 1/3 seriously lacking. I also recognize that Google has absolutely no obligation to do this, and by engaging with webmasters in a public chat like this, they’re leaps and bounds ahead of Microsoft & Yahoo!. Kudos to Google once again for their efforts to reach out.

All I really want to highlight with this post is that, for Google – or really any representative of any company or organization (US government, I’m looking in your direction) – responding to your audience in direct, honest ways (even when you can’t reveal everything) earns you far more credibility and respect than hiding behind irrelevant links/references or repetitive, company-line jargon. Whether it’s at a conference, during an interview, in a private conversation, or online, there’s a higher level worth aspiring to, and I’m both inspired by the efforts to date and left feeling that even more could be done by Google’s public faces & voices. Here’s hoping.

EDIT: When writing this post, I failed to realize that WebEX was the client used for the chat, and that 3-4 lines was the maximum amount of space available for responses. I think this blunts a significant amount of my criticism (and teaches me, once again, that I should try to understand more about a situation before I antagonize). I’ve changed the title of this post to reflect that.
